
    Longevity-Linked Life Annuities: A Bayesian Model Ensemble Pricing Approach

    Participating longevity-linked life annuities (PLLAs) are an interesting solution for managing systematic longevity risk in markets in which alternative risk management solutions are scarce and/or expensive and in which significant demand- and supply-side constraints prevent individuals from annuitizing their retirement wealth. In this paper we revisit, complement and expand previous research on the design, valuation and willingness to pay for various index-type PLLA structures. Contrary to previous studies that use a single model to forecast mortality rates, we develop a novel approach based on a Bayesian Model Ensemble of generalised age-period-cohort stochastic mortality models. To determine the weight each model receives in the final projections, we implement a backtesting cross-validation approach. We use Taiwanese mortality, yield curve and stock market data from 1980 to 2019 and adopt a longevity option decomposition valuation approach. The empirical results provide significant valuation and policy insights for building post-retirement income, particularly in Asian countries.
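
    As a rough illustration of the backtesting-based weighting idea, the sketch below (in Python, with synthetic placeholder forecasts rather than actual generalised age-period-cohort model output) scores a few hypothetical candidate models by out-of-sample error and converts the scores into ensemble weights; the paper's exact weighting rule may differ.
```python
import numpy as np

# Hypothetical out-of-sample log-mortality forecasts from K candidate mortality
# models, shape (K, n_validation_years). Values are illustrative placeholders,
# not results from the paper.
rng = np.random.default_rng(0)
actual = rng.normal(-4.0, 0.1, size=20)                               # observed log rates over validation years
forecasts = actual + rng.normal(0.0, [[0.05], [0.10], [0.20]], size=(3, 20))

# Backtesting cross-validation: score each model by its out-of-sample MSE.
mse = ((forecasts - actual) ** 2).mean(axis=1)

# One common way to turn scores into ensemble weights is a softmax over scaled
# negative errors, so better-performing models receive larger weights.
weights = np.exp(-mse / mse.min())
weights /= weights.sum()

# Ensemble point forecast: weighted average of the candidate model forecasts.
combined = weights @ forecasts
print("model weights:", np.round(weights, 3))
```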

    Forecasting mortality rates with Recurrent Neural Networks: A preliminary investigation using Portuguese data

    Forecasts of age-specific mortality rates are a critical input in multiple research and policy areas, such as assessing the overall health, well-being, and human development of a population and the pricing and risk management of life insurance contracts and longevity-linked securities. Model selection and model combination are currently the two competing approaches to modelling and forecasting mortality, often using statistical learning methods. This paper empirically investigates the predictive performance of Recurrent Neural Networks (RNNs) with a Long Short-Term Memory (LSTM) architecture in jointly modelling and forecasting the multivariate time series of age-specific mortality rates across the entire lifespan. We empirically investigate different hyperparameter choices in three-hidden-layer LSTM models and compare the models' forecasting accuracy with that produced by classical age-period and age-period-cohort stochastic mortality models. The empirical results obtained using data for Portugal suggest that the RNN with LSTM architecture can outperform traditional benchmark methods. The LSTM architecture generates smooth and consistent forecasts of mortality rates at all ages and across years. The predictive accuracy of the LSTM network is higher for both sexes, significantly outperforming the benchmarks in the male population, an interesting result given the added difficulties posed by the mortality hump and the higher variability of male survival functions. Further investigation considering other RNN architectures, calibration procedures, and sample datasets is necessary to confirm the robustness of deep learning methods in modelling human survival.
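
    The following is a minimal Keras sketch of the kind of multivariate LSTM set-up described: a three-hidden-layer network mapping a window of past years of age-specific rates to next-year rates for all ages jointly. The layer sizes, window length and synthetic data are illustrative assumptions, not the paper's configuration.
```python
import numpy as np
import tensorflow as tf

# Illustrative setup: forecast a vector of age-specific mortality rates from a
# sliding window of past years (synthetic data, not Portuguese mortality data).
n_ages, lookback = 100, 10
X = np.random.rand(200, lookback, n_ages).astype("float32")   # (samples, years, ages)
y = np.random.rand(200, n_ages).astype("float32")             # next-year rates, all ages jointly

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(lookback, n_ages)),
    tf.keras.layers.LSTM(128, return_sequences=True),   # hidden layer 1
    tf.keras.layers.LSTM(64, return_sequences=True),    # hidden layer 2
    tf.keras.layers.LSTM(32),                            # hidden layer 3
    tf.keras.layers.Dense(n_ages),                       # joint forecast across the lifespan
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
forecast = model.predict(X[-1:], verbose=0)              # one-step-ahead forecast
```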

    A non-parametric-based computationally efficient approach for credit scoring

    Ashofteh, A., & Bravo, J. M. (2019). A non-parametric-based computationally efficient approach for credit scoring. In Atas da Conferência da Associação Portuguesa de Sistemas de Informação 2019: 19ª Conferência da Associação Portuguesa de Sistemas de Informação, CAPSI 2019 - 19th Conference of the Portuguese Association for Information Systems, Lisboa, Portugal, 11-12 October 2019 (pp. 19).
    This research addresses credit scoring in risk management and presents a novel credit scoring method for default prediction. The study uses the Kruskal-Wallis non-parametric statistic to build a computationally efficient credit-scoring model based on an artificial neural network and studies the impact on modelling performance. The findings show that the new credit scoring methodology yields a reasonable coefficient of determination and a low false negative rate. It is computationally less expensive and highly accurate (AUC = 0.99). Given the recent perspective of continued credit/behaviour scoring, our study suggests using this credit score with non-traditional data sources, such as mobile phone data, to study and reveal changes in clients' behaviour over time. This is the first study to develop a non-parametric credit scoring model able to reselect effective features for continued credit evaluation, weighted by their level of contribution, with good diagnostic ability.
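
    A minimal sketch of the general idea (Kruskal-Wallis screening followed by a neural-network scorecard) is shown below, using synthetic data and scipy/scikit-learn; the data, significance threshold and network design are assumptions for illustration only.
```python
import numpy as np
from scipy.stats import kruskal
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a credit dataset: X are applicant features, y is default (0/1).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

# Kruskal-Wallis screening: keep features whose distribution differs
# significantly between defaulters and non-defaulters.
keep = [j for j in range(X.shape[1])
        if kruskal(X[y == 0, j], X[y == 1, j]).pvalue < 0.05]

# Train a neural-network scorecard on the selected features only.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X[:700][:, keep], y[:700])
scores = clf.predict_proba(X[700:][:, keep])[:, 1]
print("selected features:", keep)
print("AUC on held-out rows:", round(roc_auc_score(y[700:], scores), 3))
```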

    A study on the quality of novel coronavirus (COVID-19) official datasets

    Ashofteh, A., & Bravo, J. M. (2020). A study on the quality of novel coronavirus (COVID-19) official datasets. Statistical Journal of the IAOS, 36(2), 291-301. https://doi.org/10.3233/SJI-200674
    Policy makers depend on complex epidemiological models that must be robust, realistic, defendable and consistent with all relevant data disclosed by official authorities, which are deemed to have the highest quality standards. This paper analyses and compares the quality of official datasets available for COVID-19. We used comparative statistical analysis to evaluate the accuracy of data collection by one national (Chinese Center for Disease Control and Prevention) and two international (World Health Organization; European Centre for Disease Prevention and Control) organisations based on the value of systematic measurement errors. We combined Excel files, text mining techniques and manual data entry to extract the COVID-19 data from official reports and to generate an accurate profile for comparisons. The findings show noticeable and increasing measurement errors in the three datasets as the pandemic outbreak expanded and more countries contributed data to the official repositories, raising data comparability concerns and pointing to the need for better coordination and harmonised statistical methods. The study offers a combined COVID-19 dataset and dashboard with minimal systematic measurement errors, and valuable insights into the potential problems of using databanks without carefully examining the metadata and additional documentation that describe the overall context of the data.
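
    As a simple illustration of comparing two repositories on systematic measurement error, the sketch below aligns hypothetical daily cumulative counts from two sources and reports the mean signed discrepancy; the column names and figures are invented for the example and do not reflect the actual schemas or findings.
```python
import pandas as pd

# Hypothetical extracts of daily cumulative confirmed cases for the same country,
# one from each official source (column names are illustrative, not the real schemas).
who = pd.DataFrame({"date": pd.date_range("2020-02-01", periods=5),
                    "cases": [100, 150, 230, 310, 420]})
ecdc = pd.DataFrame({"date": pd.date_range("2020-02-01", periods=5),
                     "cases": [100, 155, 228, 330, 415]})

merged = who.merge(ecdc, on="date", suffixes=("_who", "_ecdc"))
merged["diff"] = merged["cases_who"] - merged["cases_ecdc"]

# A simple proxy for systematic measurement error between the two repositories:
# the mean signed discrepancy (bias) and its dispersion over time.
print("mean signed discrepancy:", merged["diff"].mean())
print("std of discrepancy:", merged["diff"].std())
```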

    Data Science Training for Official Statistics: a New Scientific Paradigm of Information and Knowledge Development in National Statistical Systems

    Ashofteh, A., & Bravo, J. M. (2021). Data Science Training for Official Statistics: a New Scientific Paradigm of Information and Knowledge Development in National Statistical Systems. Statistical Journal of the IAOS, 37(3), 771-789. https://doi.org/10.3233/SJI-210841
    The ability to incorporate new and Big Data sources and to benefit from emerging technologies such as web technologies, remote data collection methods, user experience platforms, and trusted smart statistics will become increasingly important for producing and disseminating official statistics. The skills and competencies required to automate, analyse, and optimize such complex systems are often not part of the traditional skill set of most National Statistical Offices (NSOs). The adoption of these technologies requires new knowledge and methodologies and the upgrading of the quality assurance framework, technology, security, privacy, and legal matters. However, there are methodological challenges and discussions among scholars about the diverse methodological confines and the wide array of skills and competencies considered relevant for those working with big data at NSOs. This paper develops a Data Science Model for Official Statistics (DSMOS), graphically summarising the role of data science in statistical business processes. The model combines data science, existing scientific paradigms, and trusted smart statistics, and is developed around a restricted number of constructs. We consider a combination of statistical engineering, data engineering, data analysis, software engineering and soft skills such as statistical thinking, statistical literacy and specific knowledge of official statistics and the dissemination of official statistics products as key requirements of data science in official statistics. We then analyse and discuss the educational requirements of the proposed model, clarifying their contribution, interactions, and current and future importance in official statistics. The DSMOS was validated through a quantitative method, using a survey addressed to experts working in the European statistical systems. The empirical results show that the core competencies considered relevant for the DSMOS include acquisition and processing capabilities related to statistics, high-frequency data, spatial data, Big Data, and microdata/nano-data, in addition to problem-solving skills, spatio-temporal modelling, machine learning, programming with R and SAS software, data visualisation using novel technologies, data and statistical literacy, ethics in official statistics, new data methodologies, and new data quality tools, standards and frameworks for official statistics. Some disadvantages and vulnerabilities are also addressed in the paper.

    Ensemble Methods for Consumer Price Inflation Forecasting

    Inflation forecasting is one of the central issues in micro- and macroeconomics. Standard forecasting methods tend to follow a winner-take-all approach in which, for each time series, a single method believed to be the best is chosen from a pool of competing models. This paper investigates the predictive accuracy of a metalearning strategy called Arbitrated Dynamic Ensemble (ADE) in inflation forecasting using United States data. The findings show that: i) the SARIMA model exhibits the best average rank relative to ADE and competing state-of-the-art model combination and metalearning methods; ii) the ADE methodology presents a better average rank than widely used model combination approaches, including the original Arbitrating approach, Stacking, simple averaging, Fixed Share, and the weighted adaptive combination of experts; iii) the ADE approach benefits from combining the base learners as opposed to selecting the best forecasting model or using all experts; and iv) the method is sensitive to the aggregation (weighting) mechanism.
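
    The sketch below conveys the general flavour of a dynamic, performance-weighted ensemble on a synthetic series; note that ADE proper trains meta-learners to predict each expert's error, whereas this simplified stand-in merely weights experts by their recent absolute errors, so it illustrates the combination idea rather than the actual algorithm.
```python
import numpy as np

# Simplified stand-in for an arbitrated dynamic ensemble: weights are derived
# from each expert's recent absolute errors on a synthetic "inflation" series.
rng = np.random.default_rng(2)
T, K, window = 120, 3, 12                       # time steps, experts, error window
y = np.cumsum(rng.normal(0.2, 1.0, T))          # synthetic target series
preds = y + rng.normal(0.0, [[0.3], [0.6], [1.0]], size=(K, T))   # expert forecasts

ensemble = np.full(T, np.nan)
for t in range(window, T):
    recent_err = np.abs(preds[:, t - window:t] - y[t - window:t]).mean(axis=1)
    w = 1.0 / (recent_err + 1e-8)               # smaller recent error -> larger weight
    w /= w.sum()
    ensemble[t] = w @ preds[:, t]               # weighted combination of experts

print("ensemble MAE:", np.nanmean(np.abs(ensemble[window:] - y[window:])))
```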

    Career breaks, broken pensions? Long-run effects of early and late-career unemployment spells on pension entitlements

    Bravo, J. M., & Herce, J. A. (2022). Career breaks, broken pensions? Long-run effects of early and late-career unemployment spells on pension entitlements. Journal of Pension Economics and Finance, 21(2), 191-217. [Advance online publication on 22 July 2020]. https://doi.org/10.1017/S1474747220000189
    Unemployment periods and other career breaks have long-term scarring effects on future labour market prospects, permanently affecting workers' retirement income and standard of living as pensioners. Previous literature has focused on the impact of job loss on working careers, with little attention to its impact on pension wealth, particularly the extent to which longevity heterogeneity amplifies unemployment scars. This paper investigates the effect of single and multiple unemployment spells on the lifetime pension entitlements of earnings-related contributory pension schemes, considering the timing and duration of breaks, alternative lifecycle labour earnings profiles, scarring and restoration effects on labour market re-entry, the existence of pension credits and pension accruals for periods spent outside the labour market, longevity heterogeneity, and the redistributive features of the pension scheme in the accumulation and decumulation phases. Pension entitlements are estimated using a backward-looking simulation approach based on the actual Portuguese public pension system rules and on stylised labour market profiles identified in the SHARE Job Episodes Panel data using sequence analysis. Longevity heterogeneity is modelled using a stochastic mortality model combined with a frailty model. Our results show that the timing and duration of unemployment periods are critical, that scarring effects amplify pension wealth losses, that minimum pension provisions, pension credits and the pension scheme's redistributive features can partially mitigate the impact of unemployment periods on future entitlements, and that, in the presence of a positive correlation between lifetime income and longevity, career breaks can amplify the asymmetry in the distribution of pension entitlements across income groups.
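
    A stylised, heavily simplified illustration of how an unemployment spell's timing, duration and scarring can feed through to an earnings-related entitlement is sketched below; the accrual rule, wage profile and scarring parameter are assumptions chosen for the example and do not reproduce the Portuguese pension rules used in the paper.
```python
import numpy as np

# Toy earnings-related scheme: 2% of wages accrued per contribution year, with an
# unemployment spell that removes contributions and scars subsequent wages.
accrual_rate = 0.02
ages = np.arange(25, 65)
wage = 20_000 * 1.02 ** (ages - 25)              # baseline lifecycle earnings profile

def pension(wages, break_start=None, break_len=0, scar=0.0):
    w = wages.copy()
    if break_start is not None:
        i = break_start - ages[0]
        w[i:i + break_len] = 0.0                 # no contributions during the spell
        w[i + break_len:] *= (1.0 - scar)        # scarring effect on re-entry wages
    return accrual_rate * w.sum()                # career-average entitlement proxy

full = pension(wage)
early_break = pension(wage, break_start=30, break_len=2, scar=0.05)
late_break = pension(wage, break_start=55, break_len=2, scar=0.05)
print("loss from early-career spell: %.1f%%" % (100 * (1 - early_break / full)))
print("loss from late-career spell:  %.1f%%" % (100 * (1 - late_break / full)))
```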

    A Novel Layered Learning Approach for Forecasting Respiratory Disease Excess Mortality during the COVID-19 pandemic

    Forecasting model selection and model combination are the two contending approaches in the time series forecasting literature. Ensemble learning is useful for addressing a predictive task with different predictive models when a direct mapping from inputs to outputs is inaccurate. We adopt a layered learning approach to an ensemble learning strategy to solve predictive tasks with improved predictive performance, taking advantage of multiple learning processes within an ensemble model. In the proposed strategy, we build each model with a specific holdout and construct the ensemble of time series models using a dynamic selection approach. In the experimental section, we study more than twelve thousand observations in a portfolio of 61 time series of reported respiratory disease deaths to demonstrate the improvement in the predictive performance of excess mortality forecasts. We then compare our model's forecasts with the corresponding total COVID-19 deaths for selected countries.
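
    The sketch below conveys the per-model-holdout and dynamic-selection idea on a synthetic series, using ridge regressions on lagged values as stand-in base models; it is an assumption-laden simplification of the layered learning strategy rather than the paper's implementation.
```python
import numpy as np
from sklearn.linear_model import Ridge

# Each base model is fitted with its own holdout block; the final forecast
# dynamically selects the model with the lowest holdout error. Data are synthetic.
rng = np.random.default_rng(3)
y = np.cumsum(rng.normal(size=200))              # synthetic weekly deaths series
lags = 8
X = np.column_stack([y[i:len(y) - lags + i] for i in range(lags)])
target = y[lags:]

holdouts = [(120, 140), (140, 160), (160, 180)]  # one holdout block per base model
models, scores = [], []
for lo, hi in holdouts:
    train = np.r_[0:lo, hi:180]                  # train outside the model's own holdout
    m = Ridge().fit(X[train], target[train])
    models.append(m)
    scores.append(np.abs(m.predict(X[lo:hi]) - target[lo:hi]).mean())

best = models[int(np.argmin(scores))]            # dynamic selection by holdout error
forecast = best.predict(X[180:])
print("holdout MAEs:", np.round(scores, 3))
```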

    Immunization strategies for funding multiple inflation-linked retirement income benefits

    Simões, C., Oliveira, L., & Bravo, J. M. (2021). Immunization strategies for funding multiple inflation-linked retirement income benefits. Risks, 9(4), 60. https://doi.org/10.3390/risks9040060
    Protecting against unexpected yield curve, inflation, and longevity shifts is one of the most critical issues institutional and private investors must address when managing post-retirement income benefits. This paper empirically investigates the performance of alternative immunization strategies for funding targeted multiple liabilities that are fixed in timing but random in size (inflation-linked), i.e., that change stochastically according to consumer price or wage level indexes. The immunization procedure is based on a targeted minimax strategy using the M-Absolute as the interest rate risk measure. We investigate to what extent the inflation-hedging properties of inflation-linked bonds (ILBs) in asset-liability management strategies targeted at immunizing multiple liabilities of random size are superior to those of nominal bonds. We use two datasets comprising daily closing prices for U.S. Treasuries and U.S. inflation-linked bonds from 2000 to 2018. The immunization performance is tested over 3-year and 5-year investment horizons, uses real rather than simulated bond data, and takes into consideration the impact of transaction costs on the performance of immunization strategies and on the selection of optimal investment strategies. The results show that the multiple-liability immunization strategy using inflation-linked bonds outperforms the equivalent strategy using nominal bonds and is robust even in a nearly zero interest rate scenario. These results have important implications for the design and structuring of ALM liability-driven investment strategies, particularly for retirement income providers such as pension schemes or life insurance companies.
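
    As a rough sketch of the M-Absolute idea for a single horizon, the code below computes the M-Absolute of a few illustrative bonds under a flat discount rate and solves for the long-only, fully invested allocation that minimises the portfolio M-Absolute; the paper's minimax strategy over multiple inflation-linked liabilities, with transaction costs, is considerably richer.
```python
import numpy as np
from scipy.optimize import linprog

# Illustrative bond universe and flat rate (placeholders, not the paper's data).
H = 5.0                                           # investment horizon in years
y_flat = 0.02                                     # flat discount rate assumption
bonds = {2: (0.01, 100), 5: (0.02, 100), 10: (0.03, 100), 20: (0.035, 100)}  # maturity: (coupon, face)

def m_absolute(maturity, coupon, face, rate, horizon):
    """M-Absolute: PV-weighted average of |t - H| over the bond's cash flows."""
    t = np.arange(1, maturity + 1)
    cf = np.full(maturity, coupon * face)
    cf[-1] += face
    pv = cf / (1 + rate) ** t
    return float(np.sum(pv / pv.sum() * np.abs(t - horizon)))

ma = np.array([m_absolute(m, c, f, y_flat, H) for m, (c, f) in bonds.items()])

# Minimise the portfolio M-Absolute (linear in the PV weights x) subject to a
# fully invested, long-only allocation; the solution concentrates near horizon H.
res = linprog(c=ma, A_eq=np.ones((1, len(ma))), b_eq=[1.0], bounds=[(0, 1)] * len(ma))
print("PV weights by maturity:", dict(zip(bonds, np.round(res.x, 3))))
```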